Superlinear Convergence of Conjugate Gradients
Authors
Abstract
We give a theoretical explanation for superlinear convergence behavior observed while solving large symmetric systems of equations using the conjugate gradient method or other Krylov subspace methods. We present a new bound on the relative error after n iterations. This bound is valid in an asymptotic sense when the size N of the system grows together with the number of iterations. The bound depends on the asymptotic eigenvalue distribution and on the ratio n/N . Under appropriate conditions we show that the bound is asymptotically sharp. Our findings are related to some recent results concerning asymptotics of discrete orthogonal polynomials. An important tool in our investigations is a constrained energy problem in logarithmic potential theory. The new asymptotic bounds for the rate of convergence are illustrated by discussing Toeplitz systems as well as a model problem stemming from the discretization of the Poisson equation.
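As a concrete illustration of the setting (not taken from the paper), the sketch below applies textbook conjugate gradients to the standard five-point discretization of the Poisson equation and records the relative residual at every step, so the convergence history can be read off against the ratio n/N; the grid size and the right-hand side of all ones are arbitrary choices.

```python
# A minimal sketch (assumptions: 5-point Laplacian as the Poisson model problem,
# right-hand side of all ones, grid size m chosen arbitrarily). Textbook CG,
# recording the relative residual per iteration.
import numpy as np
import scipy.sparse as sp

m = 40
N = m * m
T = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(m, m), format="csr")
I = sp.identity(m, format="csr")
A = sp.kron(I, T) + sp.kron(T, I)      # SPD 2-D discrete Laplacian, size N x N

b = np.ones(N)
x = np.zeros(N)
r = b - A @ x
p = r.copy()
rs = r @ r
history = []

for n in range(1, N + 1):
    Ap = A @ p
    alpha = rs / (p @ Ap)
    x += alpha * p
    r -= alpha * Ap
    rs_new = r @ r
    history.append(np.sqrt(rs_new) / np.linalg.norm(b))
    if history[-1] < 1e-12:
        break
    p = r + (rs_new / rs) * p
    rs = rs_new

for n, rel in enumerate(history, start=1):
    if n % 20 == 0 or n == len(history):
        print(f"n = {n:4d}   n/N = {n / N:5.3f}   ||r_n|| / ||r_0|| = {rel:.2e}")
```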
Similar Resources
On the Superlinear Convergence of Exact and Inexact Krylov Subspace Methods
We present a general analytical model which describes the superlinear convergence of Krylov subspace methods. We take an invariant subspace approach, so that our results apply also to inexact methods, and to non-diagonalizable matrices. Thus, we provide a unified treatment of the superlinear convergence of GMRES, Conjugate Gradients, block versions of these, and inexact subspace methods. Numeri...
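The following rough sketch, with an invented SPD matrix and spectrum rather than the authors' analytic model, illustrates the invariant subspace viewpoint: once the invariant subspace belonging to a handful of outlying eigenvalues is accounted for, whether explicitly by deflation or implicitly by the Krylov space itself, convergence proceeds at the faster rate dictated by the remaining spectrum.

```python
# A hedged sketch (SPD test matrix with 5 outlying eigenvalues, invented here):
# compare plain CG with CG on the system deflated by the invariant subspace of
# the outliers; the deflated run converges at the rate of the remaining,
# well-conditioned part of the spectrum from the start.
import numpy as np

rng = np.random.default_rng(1)
N = 400
eigs = np.concatenate([np.linspace(1.0, 2.0, N - 5),
                       [50.0, 100.0, 200.0, 400.0, 800.0]])
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
A = Q @ np.diag(eigs) @ Q.T
b = rng.standard_normal(N)

def cg_relres(A, b, iters):
    """Plain CG; returns the relative residual after each iteration."""
    x = np.zeros_like(b); r = b.copy(); p = r.copy(); rs = r @ r
    out = []
    for _ in range(iters):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        out.append(np.sqrt(rs_new) / np.linalg.norm(b))
        p = r + (rs_new / rs) * p
        rs = rs_new
    return out

V = Q[:, -5:]                    # invariant subspace of the outlying eigenvalues
P = np.eye(N) - V @ V.T          # orthogonal projector onto its complement
full = cg_relres(A, b, 40)
defl = cg_relres(P @ A @ P + V @ V.T, P @ b, 40)   # deflated (and regularized) system

for n in (5, 10, 20, 40):
    print(f"n = {n:2d}   full: {full[n - 1]:.2e}   deflated: {defl[n - 1]:.2e}")
```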
On the Superlinear Convergence of MINRES
Quantitative bounds are presented for the superlinear convergence of the MINRES method of Paige and Saunders [SIAM J. Numer. Anal., 1975] for the solution of sparse linear systems Ax = b, with A symmetric and indefinite. It is shown that the superlinear convergence is observed as soon as the harmonic Ritz values approximate well the eigenvalues of A that are either closest to zero or farthest f...
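A small numerical sketch along these lines (the symmetric indefinite test matrix and its spectrum are invented here, and harmonic Ritz values are not computed explicitly): MINRES is applied to a matrix whose spectrum consists of two well-conditioned bulks plus a few eigenvalues very close to zero and a few far from it, and the residual history typically shows the faster rate once those extreme eigenvalues have been resolved.

```python
# A hedged sketch (the symmetric indefinite test matrix and its spectrum are
# invented): run MINRES and watch the residual history; the rate typically
# improves once the Krylov space has resolved the eigenvalues nearest zero
# and the outlying ones.
import numpy as np
import scipy.sparse.linalg as spla

rng = np.random.default_rng(0)
N = 500
eigs = np.concatenate([
    np.linspace(1.0, 2.0, 245),          # positive bulk
    np.linspace(-2.0, -1.0, 245),        # negative bulk
    [0.01, -0.02, 0.05,                  # eigenvalues close to zero
     25.0, -30.0, 40.0, -45.0, 60.0, -80.0, 100.0],   # eigenvalues far from zero
])
Q, _ = np.linalg.qr(rng.standard_normal((N, N)))
A = Q @ np.diag(eigs) @ Q.T
A = 0.5 * (A + A.T)                      # enforce exact symmetry

b = rng.standard_normal(N)
res = []
spla.minres(A, b, maxiter=300,
            callback=lambda xk: res.append(np.linalg.norm(b - A @ xk)))

r0 = np.linalg.norm(b)
for k in range(9, len(res), 10):
    print(f"iter {k + 1:3d}   ||r|| / ||r_0|| = {res[k] / r0:.3e}")
```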
On the Occurrence of Superlinear Convergence of Exact and Inexact Krylov Subspace Methods
Krylov subspace methods often exhibit superlinear convergence. We present a general analytic model which describes this superlinear convergence, when it occurs. We take an invariant subspace approach, so that our results apply also to inexact methods, and to non-diagonalizable matrices. Thus, we provide a unified treatment of the superlinear convergence of GMRES, Conjugate Gradients, block vers...
Final Iterations in Interior Point Methods: Preconditioned Conjugate Gradients and Modified Search Directions
In this article we consider modified search directions in the endgame of interior point methods for linear programming. In this stage, the normal equations determining the search directions become ill-conditioned. The modified search directions are computed by solving perturbed systems in which the systems may be solved efficiently by the preconditioned conjugate gradient solver. We prove the conv...
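As a hedged illustration of why this endgame is delicate (random LP data; the paper's perturbed systems and modified search directions are not reproduced here), the sketch below shows how the normal-equations matrix A D A^T that defines the search direction becomes increasingly ill-conditioned as the barrier parameter decreases.

```python
# A hedged sketch with invented LP data: a crude model of the interior-point
# endgame, where the diagonal scaling splits into very large and very small
# entries and the normal-equations matrix A D A^T loses conditioning.
import numpy as np

rng = np.random.default_rng(3)
m, n = 40, 120                         # constraints x variables of a toy LP
A = rng.standard_normal((m, n))
basic = rng.choice(n, size=m - 5, replace=False)   # crude "optimal basis" guess

for mu in (1e-2, 1e-4, 1e-6, 1e-8):
    # Crude endgame model: ratios x_j / s_j grow like 1/mu on the "basic"
    # indices and shrink like mu on the rest as the barrier parameter mu -> 0.
    d = np.full(n, mu)
    d[basic] = 1.0 / mu
    M = A @ np.diag(d) @ A.T           # normal-equations matrix A D A^T
    print(f"mu = {mu:.0e}   cond(A D A^T) = {np.linalg.cond(M):.2e}")
```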
Superlinear CG Convergence for Special Right-hand Sides
Recently, we gave a theoretical explanation for superlinear convergence behavior observed while solving large symmetric systems of equations using the Conjugate Gradient method. Roughly speaking, one may observe superlinear convergence while solving a sequence of (symmetric positive definite) linear systems if the asymptotic eigenvalue distribution of the sequence of the corresponding matrices ...
Journal: SIAM J. Numerical Analysis
Volume: 39
Issue: -
Pages: -
Publication date: 2001